learn.kkinn.com - /Courses/ML/[FCO] AppliedAICourse - Applied Machine Learning Course/



4/19/2023 5:40 AM <dir> 1.1 - How to Learn from Appliedaicourse
4/19/2023 4:24 AM <dir> 1.2 - How the Job Guarantee program works
4/19/2023 5:40 AM <dir> 10.1 - Why learn it
4/19/2023 5:40 AM <dir> 10.10 - Hyper Cube, Hyper Cuboid
4/19/2023 5:40 AM <dir> 10.11 - Revision Questions
4/19/2023 5:40 AM <dir> 10.2 - Introduction to Vectors (2-D, 3-D, n-D), Row Vector and Column Vector
4/19/2023 5:40 AM <dir> 10.3 - Dot Product and Angle between 2 Vectors
4/19/2023 5:40 AM <dir> 10.4 - Projection and Unit Vector
4/19/2023 5:40 AM <dir> 10.5 - Equation of a line (2-D), Plane(3-D) and Hyperplane (n-D), Plane Passing through origin, Normal to a Plane
4/19/2023 5:40 AM <dir> 10.6 - Distance of a point from a Plane/Hyperplane, Half-Spaces
4/19/2023 5:40 AM <dir> 10.7 - Equation of a Circle (2-D), Sphere (3-D) and Hypersphere (n-D)
4/19/2023 5:40 AM <dir> 10.8 - Equation of an Ellipse (2-D), Ellipsoid (3-D) and Hyperellipsoid (n-D)
4/19/2023 5:40 AM <dir> 10.9 - Square, Rectangle
4/19/2023 5:40 AM <dir> 11.1 - Introduction to Probability and Statistics
4/19/2023 5:40 AM <dir> 11.10 - How distributions are used
4/19/2023 5:40 AM <dir> 11.11 - Chebyshev’s inequality
4/19/2023 5:40 AM <dir> 11.12 - Discrete and Continuous Uniform distributions
4/19/2023 5:40 AM <dir> 11.13 - How to randomly sample data points (Uniform Distribution)
4/19/2023 5:40 AM <dir> 11.14 - Bernoulli and Binomial Distribution
4/19/2023 5:41 AM <dir> 11.15 - Log Normal Distribution
4/19/2023 5:41 AM <dir> 11.16 - Power law distribution
4/19/2023 5:41 AM <dir> 11.17 - Box cox transform
4/19/2023 5:41 AM <dir> 11.18 - Applications of non-gaussian distributions
4/19/2023 5:41 AM <dir> 11.19 - Co-variance
4/19/2023 5:41 AM <dir> 11.2 - Population and Sample
4/19/2023 5:41 AM <dir> 11.20 - Pearson Correlation Coefficient
4/19/2023 5:41 AM <dir> 11.21 - Spearman Rank Correlation Coefficient
4/19/2023 5:41 AM <dir> 11.22 - Correlation vs Causation
4/19/2023 5:41 AM <dir> 11.23 - How to use correlations
4/19/2023 5:41 AM <dir> 11.24 - Confidence interval (C.I) Introduction
4/19/2023 5:41 AM <dir> 11.25 - Computing confidence interval given the underlying distribution
4/19/2023 5:41 AM <dir> 11.26 - C.I for mean of a normal random variable
4/19/2023 5:41 AM <dir> 11.27 - Confidence interval using bootstrapping
4/19/2023 5:41 AM <dir> 11.28 - Hypothesis testing methodology, Null-hypothesis, p-value
4/19/2023 5:41 AM <dir> 11.29 - Hypothesis Testing Intuition with coin toss example
4/19/2023 5:41 AM <dir> 11.3 - Gaussian/Normal Distribution and its PDF (Probability Density Function)
4/19/2023 5:41 AM <dir> 11.30 - Resampling and permutation test
4/19/2023 5:41 AM <dir> 11.31 - K-S Test for similarity of two distributions
4/19/2023 5:41 AM <dir> 11.32 - Code Snippet K-S Test
4/19/2023 5:41 AM <dir> 11.33 - Hypothesis testing another example
4/19/2023 5:41 AM <dir> 11.34 - Resampling and Permutation test another example
4/19/2023 5:41 AM <dir> 11.35 - How to use hypothesis testing
4/19/2023 5:41 AM <dir> 11.36 - Proportional Sampling
4/19/2023 5:41 AM <dir> 11.37 - Revision Questions
4/19/2023 5:41 AM <dir> 11.4 - CDF (Cumulative Distribution Function) of Gaussian/Normal distribution
4/19/2023 5:41 AM <dir> 11.5 - Symmetric distribution, Skewness and Kurtosis
4/19/2023 5:41 AM <dir> 11.6 - Standard normal variate (Z) and standardization
4/19/2023 5:41 AM <dir> 11.7 - Kernel density estimation
4/19/2023 5:41 AM <dir> 11.8 - Sampling distribution & Central Limit theorem
4/19/2023 5:41 AM <dir> 11.9 - Q-Q plot: How to test if a random variable is normally distributed or not
4/19/2023 5:41 AM <dir> 12.1 - Questions & Answers
4/19/2023 5:41 AM <dir> 13.1 - What is Dimensionality reduction
4/19/2023 5:41 AM <dir> 13.10 - Code to Load MNIST Data Set
4/19/2023 5:41 AM <dir> 13.2 - Row Vector and Column Vector
4/19/2023 5:41 AM <dir> 13.3 - How to represent a data set
4/19/2023 5:41 AM <dir> 13.4 - How to represent a dataset as a Matrix
4/19/2023 5:41 AM <dir> 13.5 - Data Preprocessing Feature Normalisation
4/19/2023 5:41 AM <dir> 13.6 - Mean of a data matrix
4/19/2023 5:41 AM <dir> 13.7 - Data Preprocessing Column Standardization
4/19/2023 5:41 AM <dir> 13.8 - Co-variance of a Data Matrix
4/19/2023 5:41 AM <dir> 13.9 - MNIST dataset (784 dimensional)
4/19/2023 5:41 AM <dir> 14.1 - Why learn PCA
4/19/2023 5:41 AM <dir> 14.10 - PCA for dimensionality reduction (not-visualization)
4/19/2023 5:41 AM <dir> 14.2 - Geometric intuition of PCA
4/19/2023 5:41 AM <dir> 14.3 - Mathematical objective function of PCA
4/19/2023 5:41 AM <dir> 14.4 - Alternative formulation of PCA Distance minimization
4/19/2023 5:41 AM <dir> 14.5 - Eigen values and Eigen vectors (PCA) Dimensionality reduction
4/19/2023 5:41 AM <dir> 14.6 - PCA for Dimensionality Reduction and Visualization
4/19/2023 5:41 AM <dir> 14.7 - Visualize MNIST dataset
4/19/2023 5:41 AM <dir> 14.8 - Limitations of PCA
4/19/2023 5:41 AM <dir> 14.9 - PCA Code example
4/19/2023 4:25 AM <dir> 15.1 - What is t-SNE
4/19/2023 5:41 AM <dir> 15.2 - Neighborhood of a point, Embedding
4/19/2023 5:41 AM <dir> 15.3 - Geometric intuition of t-SNE
4/19/2023 5:41 AM <dir> 15.4 - Crowding Problem
4/19/2023 5:41 AM <dir> 15.5 - How to apply t-SNE and interpret its output
4/19/2023 5:41 AM <dir> 15.6 - t-SNE on MNIST
4/19/2023 5:41 AM <dir> 15.7 - Code example of t-SNE
4/19/2023 5:41 AM <dir> 15.8 - Revision Questions
4/19/2023 5:41 AM <dir> 16.1 - Questions & Answers
4/19/2023 5:41 AM <dir> 17.1 - Dataset overview Amazon Fine Food reviews(EDA)
4/19/2023 5:41 AM <dir> 17.10 - Avg-Word2Vec, tf-idf weighted Word2Vec
4/19/2023 5:41 AM <dir> 17.11 - Bag of Words( Code Sample)
4/19/2023 5:41 AM <dir> 17.12 - Text Preprocessing( Code Sample)
4/19/2023 5:41 AM <dir> 17.13 - Bi-Grams and n-grams (Code Sample)
4/19/2023 5:41 AM <dir> 17.14 - TF-IDF (Code Sample)
4/19/2023 5:41 AM <dir> 17.15 - Word2Vec (Code Sample)
4/19/2023 5:41 AM <dir> 17.16 - Avg-Word2Vec and TFIDF-Word2Vec (Code Sample)
4/19/2023 5:41 AM <dir> 17.17 - Assignment-2 Apply t-SNE
4/19/2023 5:41 AM <dir> 17.2 - Data Cleaning Deduplication
4/19/2023 5:41 AM <dir> 17.3 - Why convert text to a vector
4/19/2023 5:41 AM <dir> 17.4 - Bag of Words (BoW)
4/19/2023 5:41 AM <dir> 17.5 - Text Preprocessing Stemming
4/19/2023 5:41 AM <dir> 17.6 - uni-gram, bi-gram, n-grams
4/19/2023 5:41 AM <dir> 17.7 - tf-idf (term frequency- inverse document frequency)
4/19/2023 5:41 AM <dir> 17.8 - Why use log in IDF
4/19/2023 5:41 AM <dir> 17.9 - Word2Vec
4/19/2023 5:41 AM <dir> 18.1 - How “Classification” works
4/19/2023 5:41 AM <dir> 18.10 - KNN Limitations
4/19/2023 5:41 AM <dir> 18.11 - Decision surface for K-NN as K changes
4/19/2023 5:41 AM <dir> 18.12 - Overfitting and Underfitting
4/19/2023 5:41 AM <dir> 18.13 - Need for Cross validation
4/19/2023 5:41 AM <dir> 18.14 - K-fold cross validation
4/19/2023 5:41 AM <dir> 18.15 - Visualizing train, validation and test datasets
4/19/2023 5:41 AM <dir> 18.16 - How to determine overfitting and underfitting
4/19/2023 5:41 AM <dir> 18.17 - Time based splitting
4/19/2023 5:41 AM <dir> 18.18 - k-NN for regression
4/19/2023 5:41 AM <dir> 18.19 - Weighted k-NN
4/19/2023 5:41 AM <dir> 18.2 - Data matrix notation
4/19/2023 5:41 AM <dir> 18.20 - Voronoi diagram
4/19/2023 5:41 AM <dir> 18.21 - Binary search tree
4/19/2023 5:41 AM <dir> 18.22 - How to build a kd-tree
4/19/2023 5:41 AM <dir> 18.23 - Find nearest neighbours using kd-tree
4/19/2023 5:41 AM <dir> 18.24 - Limitations of Kd tree
4/19/2023 5:41 AM <dir> 18.25 - Extensions
4/19/2023 5:41 AM <dir> 18.26 - Hashing vs LSH
4/19/2023 5:41 AM <dir> 18.27 - LSH for cosine similarity
4/19/2023 5:41 AM <dir> 18.28 - LSH for euclidean distance
4/19/2023 5:41 AM <dir> 18.29 - Probabilistic class label
4/19/2023 5:41 AM <dir> 18.3 - Classification vs Regression (examples)
4/19/2023 5:41 AM <dir> 18.30 - Code Sample: Decision boundary
4/19/2023 5:41 AM <dir> 18.31 - Code Sample: Cross Validation
4/19/2023 5:41 AM <dir> 18.32 - Revision Questions
4/19/2023 5:41 AM <dir> 18.4 - K-Nearest Neighbours Geometric intuition with a toy example
4/19/2023 5:41 AM <dir> 18.5 - Failure cases of KNN
4/19/2023 5:41 AM <dir> 18.6 - Distance measures: Euclidean (L2), Manhattan (L1), Minkowski, Hamming
4/19/2023 5:41 AM <dir> 18.7 - Cosine Distance & Cosine Similarity
4/19/2023 4:25 AM <dir> 18.8 - How to measure the effectiveness of k-NN
4/19/2023 5:41 AM <dir> 18.9 - Test/Evaluation time and space complexity
4/19/2023 5:41 AM <dir> 19.1 - Questions & Answers
4/19/2023 5:41 AM <dir> 2.1 - Python, Anaconda and relevant packages installations
4/19/2023 5:41 AM <dir> 2.10 - Control flow for loop
4/19/2023 5:41 AM <dir> 2.11 - Control flow break and continue
4/19/2023 5:41 AM <dir> 2.2 - Why learn Python
4/19/2023 5:41 AM <dir> 2.3 - Keywords and identifiers
4/19/2023 5:41 AM <dir> 2.4 - comments, indentation and statements
4/19/2023 5:41 AM <dir> 2.5 - Variables and data types in Python
4/19/2023 5:41 AM <dir> 2.6 - Standard Input and Output
4/19/2023 5:41 AM <dir> 2.7 - Operators
4/19/2023 5:41 AM <dir> 2.8 - Control flow if else
4/19/2023 5:41 AM <dir> 2.9 - Control flow while loop
4/19/2023 5:41 AM <dir> 20.1 - Introduction
4/19/2023 5:41 AM <dir> 20.10 - Local reachability-density(A)
4/19/2023 5:42 AM <dir> 20.11 - Local outlier Factor(A)
4/19/2023 5:42 AM <dir> 20.12 - Impact of Scale & Column standardization
4/19/2023 5:42 AM <dir> 20.13 - Interpretability
4/19/2023 5:42 AM <dir> 20.14 - Feature Importance and Forward Feature selection
4/19/2023 5:42 AM <dir> 20.15 - Handling categorical and numerical features
4/19/2023 5:42 AM <dir> 20.16 - Handling missing values by imputation
4/19/2023 5:42 AM <dir> 20.17 - curse of dimensionality
4/19/2023 5:42 AM <dir> 20.18 - Bias-Variance tradeoff
4/19/2023 5:42 AM <dir> 20.19 - Intuitive understanding of bias-variance
4/19/2023 5:42 AM <dir> 20.2 - Imbalanced vs balanced dataset
4/19/2023 5:42 AM <dir> 20.20 - Revision Questions
4/19/2023 5:42 AM <dir> 20.21 - Best and worst case of algorithm
4/19/2023 5:42 AM <dir> 20.3 - Multi-class classification
4/19/2023 5:42 AM <dir> 20.4 - k-NN, given a distance or similarity matrix
4/19/2023 5:42 AM <dir> 20.5 - Train and test set differences
4/19/2023 5:42 AM <dir> 20.6 - Impact of outliers
4/19/2023 5:42 AM <dir> 20.7 - Local Outlier Factor (Simple solution: Mean distance to k-NN)
4/19/2023 5:42 AM <dir> 20.8 - k distance
4/19/2023 5:42 AM <dir> 20.9 - Reachability-Distance(A,B)
4/19/2023 5:42 AM <dir> 21.1 - Accuracy
4/19/2023 5:42 AM <dir> 21.10 - Revision Questions
4/19/2023 5:42 AM <dir> 21.2 - Confusion matrix, TPR, FPR, FNR, TNR
4/19/2023 5:42 AM <dir> 21.3 - Precision and recall, F1-score
4/19/2023 5:42 AM <dir> 21.4 - Receiver Operating Characteristic Curve (ROC) curve and AUC
4/19/2023 5:42 AM <dir> 21.5 - Log-loss
4/19/2023 5:42 AM <dir> 21.6 - R-Squared/Coefficient of determination
4/19/2023 5:42 AM <dir> 21.7 - Median absolute deviation (MAD)
4/19/2023 5:42 AM <dir> 21.8 - Distribution of errors
4/19/2023 5:42 AM <dir> 21.9 - Assignment-3 Apply k-Nearest Neighbor
4/19/2023 5:42 AM <dir> 22.1 - Questions & Answers
4/19/2023 5:42 AM <dir> 23.1 - Conditional probability
4/19/2023 5:42 AM <dir> 23.10 - Bias and Variance tradeoff
4/19/2023 5:42 AM <dir> 23.11 - Feature importance and interpretability
4/19/2023 5:42 AM <dir> 23.12 - Imbalanced data
4/19/2023 5:42 AM <dir> 23.13 - Outliers
4/19/2023 5:42 AM <dir> 23.14 - Missing values
4/19/2023 5:42 AM <dir> 23.15 - Handling Numerical features (Gaussian NB)
4/19/2023 5:42 AM <dir> 23.16 - Multiclass classification
4/19/2023 5:42 AM <dir> 23.17 - Similarity or Distance matrix
4/19/2023 5:42 AM <dir> 23.18 - Large dimensionality
4/19/2023 5:42 AM <dir> 23.19 - Best and worst cases
4/19/2023 5:42 AM <dir> 23.2 - Independent vs Mutually exclusive events
4/19/2023 5:42 AM <dir> 23.20 - Code example
4/19/2023 5:42 AM <dir> 23.21 - Assignment-4 Apply Naive Bayes
4/19/2023 5:42 AM <dir> 23.22 - Revision Questions
4/19/2023 5:42 AM <dir> 23.3 - Bayes Theorem with examples
4/19/2023 5:42 AM <dir> 23.4 - Exercise problems on Bayes Theorem
4/19/2023 5:42 AM <dir> 23.5 - Naive Bayes algorithm
4/19/2023 4:26 AM <dir> 23.6 - Toy example Train and test stages
4/19/2023 5:42 AM <dir> 23.7 - Naive Bayes on Text data
4/19/2023 5:42 AM <dir> 23.8 - Laplace/Additive Smoothing
4/19/2023 5:42 AM <dir> 23.9 - Log-probabilities for numerical stability
4/19/2023 5:42 AM <dir> 24.1 - Geometric intuition of Logistic Regression
4/19/2023 5:42 AM <dir> 24.10 - Column Standardization
4/19/2023 5:42 AM <dir> 24.11 - Feature importance and Model interpretability
4/19/2023 5:42 AM <dir> 24.12 - Collinearity of features
4/19/2023 5:42 AM <dir> 24.13 - Test/Run time space and time complexity
4/19/2023 5:42 AM <dir> 24.14 - Real world cases
4/19/2023 5:42 AM <dir> 24.15 - Non-linearly separable data & feature engineering
4/19/2023 5:42 AM <dir> 24.16 - Code sample Logistic regression, GridSearchCV, RandomSearchCV
4/19/2023 5:42 AM <dir> 24.17 - Assignment-5 Apply Logistic Regression
4/19/2023 5:42 AM <dir> 24.18 - Extensions to Generalized linear models
4/19/2023 5:42 AM <dir> 24.2 - Sigmoid function Squashing
4/19/2023 5:42 AM <dir> 24.3 - Mathematical formulation of Objective function
4/19/2023 5:42 AM <dir> 24.4 - Weight vector
4/19/2023 5:42 AM <dir> 24.5 - L2 Regularization Overfitting and Underfitting
4/19/2023 5:42 AM <dir> 24.6 - L1 regularization and sparsity
4/19/2023 5:42 AM <dir> 24.7 - Probabilistic Interpretation Gaussian Naive Bayes
4/19/2023 5:42 AM <dir> 24.8 - Loss minimization interpretation
4/19/2023 5:42 AM <dir> 24.9 - hyperparameters and random search
4/19/2023 5:42 AM <dir> 25.1 - Geometric intuition of Linear Regression
4/19/2023 5:42 AM <dir> 25.2 - Mathematical formulation
4/19/2023 5:42 AM <dir> 25.3 - Real world Cases
4/19/2023 5:42 AM <dir> 25.4 - Code sample for Linear Regression
4/19/2023 5:42 AM <dir> 26.1 - Differentiation
4/19/2023 5:42 AM <dir> 26.10 - Logistic regression formulation revisited
4/19/2023 5:42 AM <dir> 26.11 - Why L1 regularization creates sparsity
4/19/2023 5:42 AM <dir> 26.12 - Assignment 6 Implement SGD for linear regression
4/19/2023 5:42 AM <dir> 26.13 - Revision questions
4/19/2023 5:42 AM <dir> 26.2 - Online differentiation tools
4/19/2023 5:42 AM <dir> 26.3 - Maxima and Minima
4/19/2023 5:42 AM <dir> 26.4 - Vector calculus Grad
4/19/2023 5:42 AM <dir> 26.5 - Gradient descent geometric intuition
4/19/2023 5:42 AM <dir> 26.6 - Learning rate
4/19/2023 5:42 AM <dir> 26.7 - Gradient descent for linear regression
4/19/2023 5:42 AM <dir> 26.8 - SGD algorithm
4/19/2023 5:42 AM <dir> 26.9 - Constrained Optimization & PCA
4/19/2023 5:42 AM <dir> 27.1 - Questions & Answers
4/19/2023 5:42 AM <dir> 28.1 - Geometric Intuition
4/19/2023 5:42 AM <dir> 28.10 - Train and run time complexities
4/19/2023 5:42 AM <dir> 28.11 - nu-SVM control errors and support vectors
4/19/2023 5:42 AM <dir> 28.12 - SVM Regression
4/19/2023 5:42 AM <dir> 28.13 - Cases
4/19/2023 5:42 AM <dir> 28.14 - Code Sample
4/19/2023 5:42 AM <dir> 28.15 - Assignment-7 Apply SVM
4/19/2023 5:42 AM <dir> 28.16 - Revision Questions
4/19/2023 5:42 AM <dir> 28.2 - Mathematical derivation
4/19/2023 5:42 AM <dir> 28.3 - Why we take values +1 and -1 for Support vector planes
4/19/2023 5:42 AM <dir> 28.4 - Loss function (Hinge Loss) based interpretation
4/19/2023 5:42 AM <dir> 28.5 - Dual form of SVM formulation
4/19/2023 5:42 AM <dir> 28.6 - kernel trick
4/19/2023 5:42 AM <dir> 28.7 - Polynomial Kernel
4/19/2023 4:26 AM <dir> 28.8 - RBF-Kernel
4/19/2023 5:42 AM <dir> 28.9 - Domain specific Kernels
4/19/2023 5:42 AM <dir> 29.1 - Questions & Answers
4/19/2023 5:42 AM <dir> 3.1 - Lists
4/19/2023 5:42 AM <dir> 3.2 - Tuples part 1
4/19/2023 5:42 AM <dir> 3.3 - Tuples part-2
4/19/2023 5:42 AM <dir> 3.4 - Sets
4/19/2023 5:42 AM <dir> 3.5 - Dictionary
4/19/2023 5:42 AM <dir> 3.6 - Strings
4/19/2023 5:42 AM <dir> 30.1 - Geometric Intuition of decision tree Axis parallel hyperplanes
4/19/2023 5:42 AM <dir> 30.10 - Overfitting and Underfitting
4/19/2023 5:43 AM <dir> 30.11 - Train and Run time complexity
4/19/2023 5:43 AM <dir> 30.12 - Regression using Decision Trees
4/19/2023 5:43 AM <dir> 30.13 - Cases
4/19/2023 5:43 AM <dir> 30.14 - Code Samples
4/19/2023 5:43 AM <dir> 30.15 - Assignment-8 Apply Decision Trees
4/19/2023 5:43 AM <dir> 30.16 - Revision Questions
4/19/2023 5:43 AM <dir> 30.2 - Sample Decision tree
4/19/2023 5:43 AM <dir> 30.3 - Building a decision Tree: Entropy
4/19/2023 5:43 AM <dir> 30.4 - Building a decision Tree: Information Gain
4/19/2023 5:43 AM <dir> 30.5 - Building a decision Tree Gini Impurity
4/19/2023 5:43 AM <dir> 30.6 - Building a decision Tree Constructing a DT
4/19/2023 5:43 AM <dir> 30.7 - Building a decision Tree Splitting numerical features
4/19/2023 5:43 AM <dir> 30.8 - Feature standardization
4/19/2023 5:43 AM <dir> 30.9 - Building a decision Tree: Categorical features with many possible values
4/19/2023 5:43 AM <dir> 31.1 - Questions & Answers
4/19/2023 5:43 AM <dir> 32.1 - What are ensembles
4/19/2023 5:43 AM <dir> 32.10 - Residuals, Loss functions and gradients
4/19/2023 5:43 AM <dir> 32.11 - Gradient Boosting
4/19/2023 5:43 AM <dir> 32.12 - Regularization by Shrinkage
4/19/2023 5:43 AM <dir> 32.13 - Train and Run time complexity
4/19/2023 5:43 AM <dir> 32.14 - XGBoost Boosting + Randomization
4/19/2023 5:43 AM <dir> 32.15 - AdaBoost geometric intuition
4/19/2023 5:43 AM <dir> 32.16 - Stacking models
4/19/2023 5:43 AM <dir> 32.17 - Cascading classifiers
4/19/2023 5:43 AM <dir> 32.18 - Kaggle competitions vs Real world
4/19/2023 5:43 AM <dir> 32.19 - Assignment-9 Apply Random Forests & GBDT
4/19/2023 5:43 AM <dir> 32.2 - Bootstrapped Aggregation (Bagging) Intuition
4/19/2023 5:43 AM <dir> 32.20 - Revision Questions
4/19/2023 5:43 AM <dir> 32.3 - Random Forest and their construction
4/19/2023 5:43 AM <dir> 32.4 - Bias-Variance tradeoff
4/19/2023 5:43 AM <dir> 32.5 - Train and run time complexity
4/19/2023 5:43 AM <dir> 32.6 - Bagging: Code Sample
4/19/2023 5:43 AM <dir> 32.7 - Extremely randomized trees
4/19/2023 5:43 AM <dir> 32.8 - Random Tree Cases
4/19/2023 5:43 AM <dir> 32.9 - Boosting Intuition
4/19/2023 5:43 AM <dir> 33.1 - Introduction
4/19/2023 5:43 AM <dir> 33.10 - Indicator variables
4/19/2023 5:43 AM <dir> 33.11 - Feature binning
4/19/2023 5:43 AM <dir> 33.12 - Interaction variables
4/19/2023 5:43 AM <dir> 33.13 - Mathematical transforms
4/19/2023 5:43 AM <dir> 33.14 - Model specific featurizations
4/19/2023 5:43 AM <dir> 33.15 - Feature orthogonality
4/19/2023 5:43 AM <dir> 33.16 - Domain specific featurizations
4/19/2023 5:43 AM <dir> 33.17 - Feature slicing
4/19/2023 5:43 AM <dir> 33.18 - Kaggle Winners solutions
4/19/2023 5:43 AM <dir> 33.2 - Moving window for Time Series Data
4/19/2023 5:43 AM <dir> 33.3 - Fourier decomposition
4/19/2023 5:43 AM <dir> 33.4 - Deep learning features LSTM
4/19/2023 5:43 AM <dir> 33.5 - Image histogram
4/19/2023 5:43 AM <dir> 33.6 - Keypoints SIFT
4/19/2023 5:43 AM <dir> 33.7 - Deep learning features CNN
4/19/2023 5:43 AM <dir> 33.8 - Relational data
4/19/2023 5:43 AM <dir> 33.9 - Graph data
4/19/2023 5:43 AM <dir> 34.1 - Calibration of Models: Need for calibration
4/19/2023 5:43 AM <dir> 34.10 - AB testing
4/19/2023 4:27 AM <dir> 34.11 - Data Science Life cycle
4/19/2023 5:43 AM <dir> 34.12 - VC dimension
4/19/2023 5:43 AM <dir> 34.2 - Productionization and deployment of Machine Learning Models
4/19/2023 5:43 AM <dir> 34.3 - Calibration Plots
4/19/2023 5:43 AM <dir> 34.4 - Platt’s Calibration/Scaling
4/19/2023 5:43 AM <dir> 34.5 - Isotonic Regression
4/19/2023 5:43 AM <dir> 34.6 - Code Samples
4/19/2023 5:43 AM <dir> 34.7 - Modeling in the presence of outliers RANSAC
4/19/2023 5:43 AM <dir> 34.8 - Productionizing models
4/19/2023 5:43 AM <dir> 34.9 - Retraining models periodically
4/19/2023 5:43 AM <dir> 35.1 - What is Clustering
4/19/2023 5:43 AM <dir> 35.10 - K-Medoids
4/19/2023 5:43 AM <dir> 35.11 - Determining the right K
4/19/2023 5:43 AM <dir> 35.12 - Code Samples
4/19/2023 5:43 AM <dir> 35.13 - Time and space complexity
4/19/2023 5:43 AM <dir> 35.14 - Assignment-10 Apply K-means, Agglomerative, DBSCAN clustering algorithms
4/19/2023 5:43 AM <dir> 35.2 - Unsupervised learning
4/19/2023 5:43 AM <dir> 35.3 - Applications
4/19/2023 5:43 AM <dir> 35.4 - Metrics for Clustering
4/19/2023 5:43 AM <dir> 35.5 - K-Means Geometric intuition, Centroids
4/19/2023 5:43 AM <dir> 35.6 - K-Means Mathematical formulation Objective function
4/19/2023 5:43 AM <dir> 35.7 - K-Means Algorithm
4/19/2023 5:43 AM <dir> 35.8 - How to initialize K-Means++
4/19/2023 5:43 AM <dir> 35.9 - Failure cases/Limitations
4/19/2023 5:43 AM <dir> 36.1 - Agglomerative & Divisive, Dendrograms
4/19/2023 5:43 AM <dir> 36.2 - Agglomerative Clustering
4/19/2023 5:43 AM <dir> 36.3 - Proximity methods Advantages and Limitations
4/19/2023 5:43 AM <dir> 36.4 - Time and Space Complexity
4/19/2023 5:43 AM <dir> 36.5 - Limitations of Hierarchical Clustering
4/19/2023 5:43 AM <dir> 36.6 - Code sample
4/19/2023 5:43 AM <dir> 36.7 - Assignment-10 Apply K-means, Agglomerative, DBSCAN clustering algorithms
4/19/2023 5:43 AM <dir> 37.1 - Density based clustering
4/19/2023 5:43 AM <dir> 37.10 - Assignment-10 Apply K-means, Agglomerative, DBSCAN clustering algorithms
4/19/2023 5:43 AM <dir> 37.11 - Revision Questions
4/19/2023 5:43 AM <dir> 37.2 - MinPts and Eps Density
4/19/2023 5:43 AM <dir> 37.3 - Core, Border and Noise points
4/19/2023 5:43 AM <dir> 37.4 - Density edge and Density connected points
4/19/2023 5:43 AM <dir> 37.5 - DBSCAN Algorithm
4/19/2023 5:43 AM <dir> 37.6 - Hyper Parameters MinPts and Eps
4/19/2023 5:43 AM <dir> 37.7 - Advantages and Limitations of DBSCAN
4/19/2023 5:43 AM <dir> 37.8 - Time and Space Complexity
4/19/2023 5:43 AM <dir> 37.9 - Code samples
4/19/2023 5:43 AM <dir> 38.1 - Problem formulation Movie reviews
4/19/2023 5:43 AM <dir> 38.10 - Matrix Factorization for recommender systems Netflix Prize Solution
4/19/2023 5:43 AM <dir> 38.11 - Cold Start problem
4/19/2023 5:43 AM <dir> 38.12 - Word vectors as MF
4/19/2023 5:43 AM <dir> 38.13 - Eigen-Faces
4/19/2023 5:43 AM <dir> 38.14 - Code example
4/19/2023 5:43 AM <dir> 38.15 - Assignment-11 Apply Truncated SVD
4/19/2023 5:43 AM <dir> 38.16 - Revision Questions
4/19/2023 5:43 AM <dir> 38.2 - Content based vs Collaborative Filtering
4/19/2023 5:43 AM <dir> 38.3 - Similarity based Algorithms
4/19/2023 5:43 AM <dir> 38.4 - Matrix Factorization PCA, SVD
4/19/2023 5:43 AM <dir> 38.5 - Matrix Factorization NMF
4/19/2023 5:43 AM <dir> 38.6 - Matrix Factorization for Collaborative filtering
4/19/2023 5:43 AM <dir> 38.7 - Matrix Factorization for feature engineering
4/19/2023 5:43 AM <dir> 38.8 - Clustering as MF
4/19/2023 5:43 AM <dir> 38.9 - Hyperparameter tuning
4/19/2023 5:43 AM <dir> 39.1 - Questions & Answers
4/19/2023 5:43 AM <dir> 4.1 - Introduction
4/19/2023 4:27 AM <dir> 4.10 - Debugging Python
4/19/2023 5:43 AM <dir> 4.2 - Types of functions
4/19/2023 5:43 AM <dir> 4.3 - Function arguments
4/19/2023 5:43 AM <dir> 4.4 - Recursive functions
4/19/2023 5:43 AM <dir> 4.5 - Lambda functions
4/19/2023 5:43 AM <dir> 4.6 - Modules
4/19/2023 5:43 AM <dir> 4.7 - Packages
4/19/2023 5:43 AM <dir> 4.8 - File Handling
4/19/2023 5:43 AM <dir> 4.9 - Exception Handling
4/19/2023 5:43 AM <dir> 40.1 - Business/Real world problem
4/19/2023 5:43 AM <dir> 40.10 - Data Modeling Multi label Classification
4/19/2023 5:44 AM <dir> 40.11 - Data preparation
4/19/2023 5:44 AM <dir> 40.12 - Train-Test Split
4/19/2023 5:44 AM <dir> 40.13 - Featurization
4/19/2023 5:44 AM <dir> 40.14 - Logistic regression One VS Rest
4/19/2023 5:44 AM <dir> 40.15 - Sampling data and tags+Weighted models
4/19/2023 5:44 AM <dir> 40.16 - Logistic regression revisited
4/19/2023 5:44 AM <dir> 40.17 - Why not use advanced techniques
4/19/2023 5:44 AM <dir> 40.18 - Assignments
4/19/2023 5:44 AM <dir> 40.2 - Business objectives and constraints
4/19/2023 5:44 AM <dir> 40.3 - Mapping to an ML problem Data overview
4/19/2023 5:44 AM <dir> 40.4 - Mapping to an ML problem: ML problem formulation
4/19/2023 5:44 AM <dir> 40.5 - Mapping to an ML problem: Performance metrics
4/19/2023 5:44 AM <dir> 40.6 - Hamming loss
4/19/2023 5:44 AM <dir> 40.7 - EDA: Data Loading
4/19/2023 5:44 AM <dir> 40.8 - EDA: Analysis of tags
4/19/2023 5:44 AM <dir> 40.9 - EDA: Data Preprocessing
4/19/2023 5:44 AM <dir> 41.1 - Business/Real world problem: Problem definition
4/19/2023 5:44 AM <dir> 41.10 - EDA Feature analysis
4/19/2023 5:44 AM <dir> 41.11 - EDA Data Visualization T-SNE
4/19/2023 5:44 AM <dir> 41.12 - EDA TF-IDF weighted Word2Vec featurization
4/19/2023 5:44 AM <dir> 41.13 - ML Models Loading Data
4/19/2023 5:44 AM <dir> 41.14 - ML Models Random Model
4/19/2023 5:44 AM <dir> 41.15 - ML Models Logistic Regression and Linear SVM
4/19/2023 5:44 AM <dir> 41.16 - ML Models XGBoost
4/19/2023 5:44 AM <dir> 41.17 - Assignments
4/19/2023 5:44 AM <dir> 41.2 - Business objectives and constraints
4/19/2023 5:44 AM <dir> 41.3 - Mapping to an ML problem Data overview
4/19/2023 5:44 AM <dir> 41.4 - Mapping to an ML problem ML problem and performance metric
4/19/2023 5:44 AM <dir> 41.5 - Mapping to an ML problem Train-test split
4/19/2023 5:44 AM <dir> 41.6 - EDA Basic Statistics
4/19/2023 5:44 AM <dir> 41.7 - EDA Basic Feature Extraction
4/19/2023 5:44 AM <dir> 41.8 - EDA Text Preprocessing
4/19/2023 5:44 AM <dir> 41.9 - EDA Advanced Feature Extraction
4/19/2023 5:44 AM <dir> 42.1 - Problem Statement Recommend similar apparel products in e-commerce using product descriptions and Images
4/19/2023 5:44 AM <dir> 42.10 - Text Pre-Processing Tokenization and Stop-word removal
4/19/2023 5:44 AM <dir> 42.11 - Stemming
4/19/2023 5:44 AM <dir> 42.12 - Text based product similarity Converting text to an n-D vector bag of words
4/19/2023 5:44 AM <dir> 42.13 - Code for bag of words based product similarity
4/19/2023 5:44 AM <dir> 42.14 - TF-IDF featurizing text based on word-importance
4/19/2023 5:44 AM <dir> 42.15 - Code for TF-IDF based product similarity
4/19/2023 5:44 AM <dir> 42.16 - Code for IDF based product similarity
4/19/2023 5:44 AM <dir> 42.17 - Text Semantics based product similarity Word2Vec(featurizing text based on semantic similarity)
4/19/2023 5:44 AM <dir> 42.18 - Code for Average Word2Vec product similarity
4/19/2023 5:44 AM <dir> 42.19 - TF-IDF weighted Word2Vec
4/19/2023 5:44 AM <dir> 42.2 - Plan of action
4/19/2023 5:44 AM <dir> 42.20 - Code for IDF weighted Word2Vec product similarity
4/19/2023 5:44 AM <dir> 42.21 - Weighted similarity using brand and color
4/19/2023 5:44 AM <dir> 42.22 - Code for weighted similarity
4/19/2023 5:44 AM <dir> 42.23 - Building a real world solution
4/19/2023 5:44 AM <dir> 42.24 - Deep learning based visual product similarity: ConvNets, How to featurize an image (edges, shapes, parts)
4/19/2023 5:44 AM <dir> 42.25 - Using Keras + Tensorflow to extract features
4/19/2023 5:44 AM <dir> 42.26 - Visual similarity based product similarity
4/19/2023 5:44 AM <dir> 42.27 - Measuring goodness of our solution AB testing
4/19/2023 5:44 AM <dir> 42.28 - Exercise Build a weighted Nearest neighbor model using Visual, Text, Brand and Color
4/19/2023 5:44 AM <dir> 42.3 - Amazon product advertising API
4/19/2023 4:28 AM <dir> 42.4 - Data folders and paths
4/19/2023 5:44 AM <dir> 42.5 - Overview of the data and Terminology
4/19/2023 5:44 AM <dir> 42.6 - Data cleaning and understanding: Missing data in various features
4/19/2023 5:44 AM <dir> 42.7 - Understand duplicate rows
4/19/2023 5:44 AM <dir> 42.8 - Remove duplicates Part 1
4/19/2023 5:44 AM <dir> 42.9 - Remove duplicates Part 2
4/19/2023 5:44 AM <dir> 43.1 - Business/real world problem: Problem definition
4/19/2023 5:44 AM <dir> 43.10 - ML models – using byte files only Random Model
4/19/2023 5:44 AM <dir> 43.11 - k-NN
4/19/2023 5:44 AM <dir> 43.12 - Logistic regression
4/19/2023 5:44 AM <dir> 43.13 - Random Forest and Xgboost
4/19/2023 5:44 AM <dir> 43.14 - ASM Files Feature extraction & Multiprocessing
4/19/2023 5:44 AM <dir> 43.15 - File-size feature
4/19/2023 5:44 AM <dir> 43.16 - Univariate analysis
4/19/2023 5:44 AM <dir> 43.17 - t-SNE analysis
4/19/2023 5:44 AM <dir> 43.18 - ML models on ASM file features
4/19/2023 5:44 AM <dir> 43.19 - Models on all features t-SNE
4/19/2023 5:44 AM <dir> 43.2 - Business/real world problem: Objectives and constraints
4/19/2023 5:44 AM <dir> 43.20 - Models on all features RandomForest and Xgboost
4/19/2023 5:44 AM <dir> 43.21 - Assignments
4/19/2023 5:44 AM <dir> 43.3 - Machine Learning problem mapping Data overview
4/19/2023 5:44 AM <dir> 43.4 - Machine Learning problem mapping ML problem
4/19/2023 5:44 AM <dir> 43.5 - Machine Learning problem mapping Train and test splitting
4/19/2023 5:44 AM <dir> 43.6 - Exploratory Data Analysis Class distribution
4/19/2023 5:44 AM <dir> 43.7 - Exploratory Data Analysis Feature extraction from byte files
4/19/2023 5:44 AM <dir> 43.8 - Exploratory Data Analysis Multivariate analysis of features from byte files
4/19/2023 5:44 AM <dir> 43.9 - Exploratory Data Analysis Train-Test class distribution
4/19/2023 5:44 AM <dir> 44.1 - Business/Real world problem: Problem definition
4/19/2023 5:44 AM <dir> 44.10 - Exploratory Data Analysis: Cold start problem
4/19/2023 5:44 AM <dir> 44.11 - Computing Similarity matrices: User-User similarity matrix
4/19/2023 5:44 AM <dir> 44.12 - Computing Similarity matrices: Movie-Movie similarity
4/19/2023 5:44 AM <dir> 44.13 - Computing Similarity matrices: Does movie-movie similarity work
4/19/2023 5:44 AM <dir> 44.14 - ML Models: Surprise library
4/19/2023 5:44 AM <dir> 44.15 - Overview of the modelling strategy
4/19/2023 5:44 AM <dir> 44.16 - Data Sampling
4/19/2023 5:44 AM <dir> 44.17 - Google drive with intermediate files
4/19/2023 5:44 AM <dir> 44.18 - Featurizations for regression
4/19/2023 5:44 AM <dir> 44.19 - Data transformation for Surprise
4/19/2023 5:44 AM <dir> 44.2 - Objectives and constraints
4/19/2023 5:44 AM <dir> 44.20 - Xgboost with 13 features
4/19/2023 5:44 AM <dir> 44.21 - Surprise Baseline model
4/19/2023 5:44 AM <dir> 44.22 - Xgboost + 13 features + Surprise baseline model
4/19/2023 5:44 AM <dir> 44.23 - Surprise KNN predictors
4/19/2023 5:44 AM <dir> 44.24 - Matrix Factorization models using Surprise
4/19/2023 5:44 AM <dir> 44.25 - SVD++ with implicit feedback
4/19/2023 5:44 AM <dir> 44.26 - Final models with all features and predictors
4/19/2023 5:44 AM <dir> 44.27 - Comparison between various models
4/19/2023 5:44 AM <dir> 44.28 - Assignments
4/19/2023 5:44 AM <dir> 44.3 - Mapping to an ML problem: Data overview
4/19/2023 5:44 AM <dir> 44.4 - Mapping to an ML problem: ML problem formulation
4/19/2023 5:44 AM <dir> 44.5 - Exploratory Data Analysis: Data preprocessing
4/19/2023 5:44 AM <dir> 44.6 - Exploratory Data Analysis: Temporal Train-Test split
4/19/2023 5:44 AM <dir> 44.7 - Exploratory Data Analysis: Preliminary data analysis
4/19/2023 5:44 AM <dir> 44.8 - Exploratory Data Analysis: Sparse matrix representation
4/19/2023 5:44 AM <dir> 44.9 - Exploratory Data Analysis: Average ratings for various slices
4/19/2023 5:44 AM <dir> 45.1 - Business/Real world problem: Overview
4/19/2023 5:44 AM <dir> 45.10 - Univariate Analysis: Variation Feature
4/19/2023 5:44 AM <dir> 45.11 - Univariate Analysis: Text feature
4/19/2023 5:44 AM <dir> 45.12 - Machine Learning Models: Data preparation
4/19/2023 5:44 AM <dir> 45.13 - Baseline Model Naive Bayes
4/19/2023 5:44 AM <dir> 45.14 - K-Nearest Neighbors Classification
4/19/2023 5:44 AM <dir> 45.15 - Logistic Regression with class balancing
4/19/2023 5:44 AM <dir> 45.16 - Logistic Regression without class balancing
4/19/2023 5:44 AM <dir> 45.17 - Linear-SVM
4/19/2023 5:44 AM <dir> 45.18 - Random-Forest with one-hot encoded features
4/19/2023 5:44 AM <dir> 45.19 - Random-Forest with response-coded features
4/19/2023 5:44 AM <dir> 45.2 - Business objectives and constraints
4/19/2023 5:44 AM <dir> 45.20 - Stacking Classifier
4/19/2023 4:28 AM <dir> 45.21 - Majority Voting classifier
4/19/2023 5:44 AM <dir> 45.22 - Assignments
4/19/2023 5:44 AM <dir> 45.3 - ML problem formulation Data
4/19/2023 5:44 AM <dir> 45.4 - ML problem formulation Mapping real world to ML problem
4/19/2023 5:44 AM <dir> 45.5 - ML problem formulation Train, CV and Test data construction
4/19/2023 5:44 AM <dir> 45.6 - Exploratory Data Analysis: Reading data & preprocessing
4/19/2023 5:44 AM <dir> 45.7 - Exploratory Data Analysis: Distribution of Class-labels
4/19/2023 5:44 AM <dir> 45.8 - Exploratory Data Analysis “Random” Model
4/19/2023 5:44 AM <dir> 45.9 - Univariate Analysis: Gene feature
4/19/2023 5:44 AM <dir> 46.1 - Business/Real world problem: Overview
4/19/2023 5:44 AM <dir> 46.10 - Data Cleaning Speed
4/19/2023 5:45 AM <dir> 46.11 - Data Cleaning Distance
4/19/2023 5:45 AM <dir> 46.12 - Data Cleaning Fare
4/19/2023 5:45 AM <dir> 46.13 - Data Cleaning: Remove all outliers/erroneous points
4/19/2023 5:45 AM <dir> 46.14 - Data Preparation: Clustering/Segmentation
4/19/2023 5:45 AM <dir> 46.15 - Data Preparation: Time binning
4/19/2023 5:45 AM <dir> 46.16 - Data Preparation: Smoothing time-series data
4/19/2023 5:45 AM <dir> 46.17 - Data Preparation: Smoothing time-series data (contd.)
4/19/2023 5:45 AM <dir> 46.18 - Data Preparation: Time series and Fourier transforms
4/19/2023 5:45 AM <dir> 46.19 - Ratios and previous-time-bin values
4/19/2023 5:45 AM <dir> 46.2 - Objectives and Constraints
4/19/2023 5:45 AM <dir> 46.20 - Simple moving average
4/19/2023 5:45 AM <dir> 46.21 - Weighted Moving average
4/19/2023 5:45 AM <dir> 46.22 - Exponential weighted moving average
4/19/2023 5:45 AM <dir> 46.23 - Results
4/19/2023 5:45 AM <dir> 46.24 - Regression models Train-Test split & Features
4/19/2023 5:45 AM <dir> 46.25 - Linear regression
4/19/2023 5:45 AM <dir> 46.26 - Random Forest regression
4/19/2023 5:45 AM <dir> 46.27 - Xgboost Regression
4/19/2023 5:45 AM <dir> 46.28 - Model comparison
4/19/2023 5:45 AM <dir> 46.29 - Assignment
4/19/2023 5:45 AM <dir> 46.3 - Mapping to ML problem Data
4/19/2023 5:45 AM <dir> 46.4 - Mapping to ML problem: dask dataframes
4/19/2023 5:45 AM <dir> 46.5 - Mapping to ML problem: Fields/Features
4/19/2023 5:45 AM <dir> 46.6 - Mapping to ML problem: Time series forecasting/Regression
4/19/2023 5:45 AM <dir> 46.7 - Mapping to ML problem Performance metrics
4/19/2023 5:45 AM <dir> 46.8 - Data Cleaning Latitude and Longitude data
4/19/2023 5:45 AM <dir> 46.9 - Data Cleaning Trip Duration
4/19/2023 5:45 AM <dir> 47.1 - History of Neural networks and Deep Learning
4/19/2023 5:45 AM <dir> 47.10 - Backpropagation
4/19/2023 5:45 AM <dir> 47.11 - Activation functions
4/19/2023 5:45 AM <dir> 47.12 - Vanishing Gradient problem
4/19/2023 5:45 AM <dir> 47.13 - Bias-Variance tradeoff
4/19/2023 5:45 AM <dir> 47.14 - Decision surfaces Playground
4/19/2023 5:45 AM <dir> 47.2 - How Biological Neurons work
4/19/2023 5:45 AM <dir> 47.3 - Growth of biological neural networks
4/19/2023 5:45 AM <dir> 47.4 - Diagrammatic representation Logistic Regression and Perceptron
4/19/2023 5:45 AM <dir> 47.5 - Multi-Layered Perceptron (MLP)
4/19/2023 5:45 AM <dir> 47.6 - Notation
4/19/2023 5:45 AM <dir> 47.7 - Training a single-neuron model
4/19/2023 5:45 AM <dir> 47.8 - Training an MLP Chain Rule
4/19/2023 5:45 AM <dir> 47.9 - Training an MLP: Memoization
4/19/2023 5:45 AM <dir> 48.1 - Deep Multi-layer perceptrons: 1980s to 2010s
4/19/2023 5:45 AM <dir> 48.10 - Nesterov Accelerated Gradient (NAG)
4/19/2023 5:45 AM <dir> 48.11 - Optimizers: AdaGrad
4/19/2023 4:28 AM <dir> 48.12 - Optimizers: Adadelta and RMSProp
4/19/2023 5:45 AM <dir> 48.13 - Adam
4/19/2023 5:45 AM <dir> 48.14 - Which algorithm to choose when
4/19/2023 5:45 AM <dir> 48.15 - Gradient Checking and clipping
4/19/2023 5:45 AM <dir> 48.16 - Softmax and Cross-entropy for multi-class classification
4/19/2023 5:45 AM <dir> 48.17 - How to train a Deep MLP
4/19/2023 5:45 AM <dir> 48.18 - Auto Encoders
4/19/2023 5:45 AM <dir> 48.19 - Word2Vec CBOW
4/19/2023 5:45 AM <dir> 48.2 - Dropout layers & Regularization
4/19/2023 5:45 AM <dir> 48.20 - Word2Vec Skip-gram
4/19/2023 5:45 AM <dir> 48.21 - Word2Vec Algorithmic Optimizations
4/19/2023 5:45 AM <dir> 48.3 - Rectified Linear Units (ReLU)
4/19/2023 5:45 AM <dir> 48.4 - Weight initialization
4/19/2023 5:45 AM <dir> 48.5 - Batch Normalization
4/19/2023 5:45 AM <dir> 48.6 - Optimizers: Hill-descent analogy in 2D
4/19/2023 5:45 AM <dir> 48.7 - Optimizers: Hill descent in 3D and contours
4/19/2023 5:45 AM <dir> 48.8 - SGD Recap
4/19/2023 5:45 AM <dir> 48.9 - Batch SGD with momentum
4/19/2023 5:45 AM <dir> 49.1 - Tensorflow and Keras overview
4/19/2023 5:45 AM <dir> 49.10 - Model 3 Batch Normalization
4/19/2023 5:45 AM <dir> 49.11 - Model 4 Dropout
4/19/2023 5:45 AM <dir> 49.12 - MNIST classification in Keras
4/19/2023 5:45 AM <dir> 49.13 - Hyperparameter tuning in Keras
4/19/2023 5:45 AM <dir> 49.14 - Exercise Try different MLP architectures on MNIST dataset
4/19/2023 5:45 AM <dir> 49.2 - GPU vs CPU for Deep Learning
4/19/2023 5:45 AM <dir> 49.3 - Google Colaboratory
4/19/2023 5:45 AM <dir> 49.4 - Install TensorFlow
4/19/2023 5:45 AM <dir> 49.5 - Online documentation and tutorials
4/19/2023 5:45 AM <dir> 49.6 - Softmax Classifier on MNIST dataset
4/19/2023 5:45 AM <dir> 49.7 - MLP Initialization
4/19/2023 5:45 AM <dir> 49.8 - Model 1 Sigmoid activation
4/19/2023 5:45 AM <dir> 49.9 - Model 2 ReLU activation
4/19/2023 5:45 AM <dir> 5.1 - Numpy Introduction
4/19/2023 5:45 AM <dir> 5.2 - Numerical operations on Numpy
4/19/2023 5:45 AM <dir> 50.1 - Biological inspiration Visual Cortex
4/19/2023 5:45 AM <dir> 50.10 - Data Augmentation
4/19/2023 5:45 AM <dir> 50.11 - Convolution Layers in Keras
4/19/2023 5:45 AM <dir> 50.12 - AlexNet
4/19/2023 5:45 AM <dir> 50.13 - VGGNet
4/19/2023 4:29 AM <dir> 50.14 - Residual Network
4/19/2023 5:45 AM <dir> 50.15 - Inception Network
4/19/2023 5:45 AM <dir> 50.16 - What is Transfer learning
4/19/2023 5:45 AM <dir> 50.17 - Code example Cats vs Dogs
4/19/2023 5:45 AM <dir> 50.18 - Code Example MNIST dataset
4/19/2023 5:45 AM <dir> 50.19 - Assignment Try various CNN networks on MNIST dataset
4/19/2023 5:45 AM <dir> 50.2 - Convolution: Edge Detection on images
4/19/2023 5:46 AM <dir> 50.3 - Convolution: Padding and strides
4/19/2023 5:46 AM <dir> 50.4 - Convolution over RGB images
4/19/2023 5:46 AM <dir> 50.5 - Convolutional layer
4/19/2023 5:46 AM <dir> 50.6 - Max-pooling
4/19/2023 5:46 AM <dir> 50.7 - CNN Training Optimization
4/19/2023 5:46 AM <dir> 50.8 - Example CNN LeNet [1998]
4/19/2023 5:46 AM <dir> 50.9 - ImageNet dataset
4/19/2023 5:46 AM <dir> 51.1 - Why RNNs
4/19/2023 5:46 AM <dir> 51.10 - Code example IMDB Sentiment classification
4/19/2023 5:46 AM <dir> 51.11 - Exercise Amazon Fine Food reviews LSTM model
4/19/2023 5:46 AM <dir> 51.2 - Recurrent Neural Network
4/19/2023 5:46 AM <dir> 51.3 - Training RNNs Backprop
4/19/2023 5:46 AM <dir> 51.4 - Types of RNNs
4/19/2023 5:46 AM <dir> 51.5 - Need for LSTM/GRU
4/19/2023 5:46 AM <dir> 51.6 - LSTM
4/19/2023 5:46 AM <dir> 51.7 - GRUs
4/19/2023 5:46 AM <dir> 51.8 - Deep RNN
4/19/2023 5:46 AM <dir> 51.9 - Bidirectional RNN
4/19/2023 5:46 AM <dir> 52.1 - Questions and Answers
4/19/2023 5:46 AM <dir> 53.1 - Self Driving Car Problem definition
4/19/2023 5:46 AM <dir> 53.10 - NVIDIA’s end to end CNN model
4/19/2023 5:46 AM <dir> 53.11 - Train the model
4/19/2023 5:46 AM <dir> 53.12 - Test and visualize the output
4/19/2023 5:46 AM <dir> 53.13 - Extensions
4/19/2023 5:46 AM <dir> 53.14 - Assignment
4/19/2023 5:46 AM <dir> 53.2 - Datasets
4/19/2023 5:46 AM <dir> 53.3 - Data understanding & Analysis Files and folders
4/19/2023 5:46 AM <dir> 53.4 - Dash-cam images and steering angles
4/19/2023 5:46 AM <dir> 53.5 - Split the dataset Train vs Test
4/19/2023 5:46 AM <dir> 53.6 - EDA Steering angles
4/19/2023 5:46 AM <dir> 53.7 - Mean Baseline model simple
4/19/2023 5:46 AM <dir> 53.8 - Deep-learning model: Deep Learning for regression (CNN, CNN+RNN)
4/19/2023 5:46 AM <dir> 53.9 - Batch load the dataset
4/19/2023 5:46 AM <dir> 54.1 - Real-world problem
4/19/2023 5:46 AM <dir> 54.10 - MIDI music generation
4/19/2023 5:46 AM <dir> 54.11 - Survey blog
4/19/2023 5:46 AM <dir> 54.2 - Music representation
4/19/2023 5:46 AM <dir> 54.3 - Char-RNN with abc-notation Char-RNN model
4/19/2023 4:29 AM <dir> 54.4 - Char-RNN with abc-notation Data preparation
4/19/2023 5:46 AM <dir> 54.5 - Char-RNN with abc-notation: Many to Many RNN, TimeDistributed-Dense layer
4/19/2023 5:46 AM <dir> 54.6 - Char-RNN with abc-notation: Stateful RNN
4/19/2023 5:46 AM <dir> 54.7 - Char-RNN with abc-notation: Model architecture, Model training
4/19/2023 5:46 AM <dir> 54.8 - Char-RNN with abc-notation Music generation
4/19/2023 5:46 AM <dir> 54.9 - Char-RNN with abc-notation Generate tabla music
4/19/2023 5:46 AM <dir> 55.1 - Human Activity Recognition Problem definition
4/19/2023 5:46 AM <dir> 55.2 - Dataset understanding
4/19/2023 5:46 AM <dir> 55.3 - Data cleaning & preprocessing
4/19/2023 5:46 AM <dir> 55.4 - EDA: Univariate analysis
4/19/2023 5:46 AM <dir> 55.5 - EDA: Data visualization using t-SNE
4/19/2023 5:46 AM <dir> 55.6 - Classical ML models
4/19/2023 5:46 AM <dir> 55.7 - Deep-learning Model
4/19/2023 5:46 AM <dir> 55.8 - Exercise Build deeper LSTM models and hyper-param tune them
4/19/2023 5:46 AM <dir> 56.1 - Problem definition
4/19/2023 5:46 AM <dir> 56.10 - Feature engineering on Graphs: Jaccard & Cosine Similarities
4/19/2023 5:46 AM <dir> 56.11 - PageRank
4/19/2023 5:46 AM <dir> 56.12 - Shortest Path
4/19/2023 5:46 AM <dir> 56.13 - Connected-components
4/19/2023 5:46 AM <dir> 56.14 - Adar Index
4/19/2023 5:46 AM <dir> 56.15 - Katz Centrality
4/19/2023 5:46 AM <dir> 56.16 - HITS Score
4/19/2023 5:46 AM <dir> 56.17 - SVD
4/19/2023 5:46 AM <dir> 56.18 - Weight features
4/19/2023 5:46 AM <dir> 56.19 - Modeling
4/19/2023 5:46 AM <dir> 56.2 - Overview of Graphs: node/vertex, edge/link, directed-edge, path
4/19/2023 5:46 AM <dir> 56.3 - Data format & Limitations
4/19/2023 5:46 AM <dir> 56.4 - Mapping to a supervised classification problem
4/19/2023 5:46 AM <dir> 56.5 - Business constraints & Metrics
4/19/2023 5:46 AM <dir> 56.6 - EDA: Basic Stats
4/19/2023 5:46 AM <dir> 56.7 - EDA: Follower and following stats
4/19/2023 5:46 AM <dir> 56.8 - EDA: Binary Classification Task
4/19/2023 5:46 AM <dir> 56.9 - EDA: Train and test split
4/19/2023 5:46 AM <dir> 57.1 - Introduction to Databases
4/19/2023 5:46 AM <dir> 57.10 - ORDER BY
4/19/2023 5:46 AM <dir> 57.11 - DISTINCT
4/19/2023 5:46 AM <dir> 57.12 - WHERE, Comparison operators, NULL
4/19/2023 5:46 AM <dir> 57.13 - Logical Operators
4/19/2023 5:46 AM <dir> 57.14 - Aggregate Functions COUNT, MIN, MAX, AVG, SUM
4/19/2023 5:46 AM <dir> 57.15 - GROUP BY
4/19/2023 5:46 AM <dir> 57.16 - HAVING
4/19/2023 5:46 AM <dir> 57.17 - Order of keywords
4/19/2023 5:46 AM <dir> 57.18 - Join and Natural Join
4/19/2023 5:46 AM <dir> 57.19 - Inner, Left, Right and Outer joins
4/19/2023 5:46 AM <dir> 57.2 - Why SQL
4/19/2023 5:46 AM <dir> 57.20 - Sub Queries/Nested Queries/Inner Queries
4/19/2023 5:46 AM <dir> 57.21 - DML: INSERT
4/19/2023 5:46 AM <dir> 57.22 - DML: UPDATE, DELETE
4/19/2023 5:46 AM <dir> 57.23 - DDL: CREATE TABLE
4/19/2023 5:46 AM <dir> 57.24 - DDL: ALTER ADD, MODIFY, DROP
4/19/2023 5:46 AM <dir> 57.25 - DDL: DROP TABLE, TRUNCATE, DELETE
4/19/2023 5:46 AM <dir> 57.26 - Data Control Language GRANT, REVOKE
4/19/2023 5:46 AM <dir> 57.27 - Learning resources
4/19/2023 5:46 AM <dir> 57.3 - Execution of an SQL statement
4/19/2023 5:46 AM <dir> 57.4 - IMDB dataset
4/19/2023 5:46 AM <dir> 57.5 - Installing MySQL
4/19/2023 5:46 AM <dir> 57.6 - Load IMDB data
4/19/2023 4:30 AM <dir> 57.7 - USE, DESCRIBE, SHOW TABLES
4/19/2023 5:46 AM <dir> 57.8 - SELECT
4/19/2023 5:46 AM <dir> 57.9 - LIMIT, OFFSET
4/19/2023 5:46 AM <dir> 58.1 - Ad-Click Prediction
4/19/2023 5:46 AM <dir> 59.1 - Revision Questions
4/19/2023 5:46 AM <dir> 59.2 - Questions
4/19/2023 5:46 AM <dir> 59.3 - External resources for Interview Questions
4/19/2023 5:46 AM <dir> 6.1 - Getting started with Matplotlib
4/19/2023 5:46 AM <dir> 7.1 - Getting started with pandas
4/19/2023 5:46 AM <dir> 7.2 - Data Frame Basics
4/19/2023 5:46 AM <dir> 7.3 - Key Operations on Data Frames
4/19/2023 5:46 AM <dir> 8.1 - Space and Time Complexity Find largest number in a list
4/19/2023 5:46 AM <dir> 8.2 - Binary search
4/19/2023 5:47 AM <dir> 8.3 - Find elements common in two lists
4/19/2023 5:47 AM <dir> 8.4 - Find elements common in two lists using a Hashtable/Dict
4/19/2023 5:47 AM <dir> 9.1 - Introduction to IRIS dataset and 2D scatter plot
4/19/2023 5:47 AM <dir> 9.10 - Percentiles and Quantiles
4/19/2023 5:47 AM <dir> 9.11 - IQR(Inter Quartile Range) and MAD(Median Absolute Deviation)
4/19/2023 5:47 AM <dir> 9.12 - Box-plot with Whiskers
4/19/2023 5:47 AM <dir> 9.13 - Violin Plots
4/19/2023 5:47 AM <dir> 9.14 - Summarizing Plots, Univariate, Bivariate and Multivariate analysis
4/19/2023 5:47 AM <dir> 9.15 - Multivariate Probability Density, Contour Plot
4/19/2023 5:47 AM <dir> 9.16 - Exercise Perform EDA on Haberman dataset
4/19/2023 5:47 AM <dir> 9.2 - 3D scatter plot
4/19/2023 5:47 AM <dir> 9.3 - Pair plots
4/19/2023 5:47 AM <dir> 9.4 - Limitations of Pair Plots
4/19/2023 5:47 AM <dir> 9.5 - Histogram and Introduction to PDF(Probability Density Function)
4/19/2023 5:47 AM <dir> 9.6 - Univariate Analysis using PDF
4/19/2023 5:47 AM <dir> 9.7 - CDF(Cumulative Distribution Function)
4/19/2023 5:47 AM <dir> 9.8 - Mean, Variance and Standard Deviation
4/19/2023 5:47 AM <dir> 9.9 - Median